Nonlinear backpropagation: doing backpropagation without derivatives of the activation function
Authors
Abstract
The conventional linear backpropagation algorithm is replaced by a nonlinear version, which avoids the need to calculate the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the nonlinear backpropagation algorithms in the framework of recurrent backpropagation and present some numerical simulations of fee...
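The idea can be motivated by a first-order Taylor expansion: for a small backpropagated error e, g(h + e) − g(h) ≈ g′(h) · e, so pushing the error through the activation g itself can stand in for multiplying by its derivative. Below is a minimal NumPy sketch of that substitution on a made-up 2-4-1 network; it illustrates the principle only, not the paper's recurrent formulation, and all sizes and constants are arbitrary.

```python
import numpy as np

def g(h):
    """Activation (tanh); note that g' is never evaluated below."""
    return np.tanh(h)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 2))   # hypothetical 2-4-1 network
W2 = rng.normal(scale=0.5, size=(1, 4))

x = rng.normal(size=2)
h = W1 @ x                                # hidden pre-activations
y = W2 @ g(h)                             # network output
err = 0.1 * (W2.T @ (y - 1.0))            # small backpropagated error (dummy target 1.0)

# Standard backprop multiplies by the derivative g'(h):
delta_std = (1 - np.tanh(h) ** 2) * err
# The nonlinear variant pushes the error through g itself instead:
delta_nl = g(h + err) - g(h)              # = g'(h) * err + O(err^2)

print(delta_std)
print(delta_nl)                           # nearly identical for small errors
```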
Similar resources

Backpropagation without Multiplication
Hans Peter Graf, AT&T Bell Laboratories, Holmdel, NJ 07733. The back propagation algorithm has been modified to work without any multiplications and to tolerate computations with a low resolution, which makes it more attractive for a hardware implementation. Numbers are represented in floating point format with 1 bit mantissa and 3 bits in the exponent for the states, and 1 bit mantissa and 5 ...
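With a 1-bit mantissa every representable value is ±2^e, so a product reduces to combining sign bits and adding integer exponents (a shift in hardware), which is presumably how the low-resolution format eliminates multiplications. A toy sketch of that reduction, not the paper's circuit:

```python
import math

def to_pow2(x):
    """Quantize x to the nearest power of two (1-bit mantissa): (sign, exponent)."""
    if x == 0.0:
        return (0, 0)                        # simplistic zero handling
    sign = 1 if x > 0 else -1
    return (sign, round(math.log2(abs(x))))

def mul_pow2(a, b):
    """Multiply two power-of-two numbers without a multiplier:
    combine the sign bits and add the exponents (a shift in hardware)."""
    (sa, ea), (sb, eb) = a, b
    if sa == 0 or sb == 0:
        return (0, 0)
    return (sa * sb, ea + eb)                # sign combine is an XOR in hardware

def to_float(p):
    sign, e = p
    return sign * 2.0 ** e if sign else 0.0

a, b = to_pow2(0.3), to_pow2(-5.0)           # quantized operands: 0.25 and -4.0
print(to_float(mul_pow2(a, b)))              # -1.0, computed without multiplying
print(to_float(a) * to_float(b))             # same result, confirming the identity
```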
Backpropagation generalized for output derivatives
The backpropagation algorithm is the cornerstone of neural network analysis. The paper extends it to training any derivatives of a neural network's output with respect to its input. By dint of this, feedforward networks can be used to solve or verify solutions of partial or ordinary, linear or nonlinear differential equations. This method differs vastly from traditional ones like finite differences on a...
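As a rough illustration of supervising dN/dx rather than N(x): for a one-hidden-layer tanh network the input derivative has a closed form, so the idea can be sketched without the paper's generalized backward pass. The loop below uses a crude numeric gradient in place of the paper's analytic rule, and every hyperparameter is an arbitrary choice; it trains dN/dx to match cos(x), i.e. it solves N′(x) = cos(x) up to a constant.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 16                                       # hidden units (arbitrary choice)
params = rng.normal(scale=0.5, size=3 * K)   # flattened [w, b, v]

def dN_dx(p, x):
    """Input derivative of N(x) = sum_k v_k * tanh(w_k * x + b_k):
    dN/dx = sum_k v_k * w_k * (1 - tanh^2(w_k * x + b_k))."""
    w, b, v = np.split(p, 3)
    z = np.outer(x, w) + b                   # shape (n, K)
    return (1 - np.tanh(z) ** 2) @ (v * w)

x = np.linspace(0.0, np.pi, 64)
target = np.cos(x)                           # supervise dN/dx, not N itself

def loss(p):
    return np.mean((dN_dx(p, x) - target) ** 2)

# Crude forward-difference gradient descent, standing in for the paper's
# generalized backward pass; enough to show the derivative is trainable.
eps, lr = 1e-5, 0.05
I = np.eye(3 * K)
for _ in range(2000):
    base = loss(params)
    grad = np.array([(loss(params + eps * I[i]) - base) / eps
                     for i in range(3 * K)])
    params -= lr * grad
print("derivative-matching loss:", loss(params))
```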
Hardware Implementation of the Backpropagation without Multiplication
The back propagation algorithm has been modified to work without any multiplications and to tolerate computations with a low resolution, which makes it more attractive for a hardware implementation. Numbers are represented in floating-point format with 1 bit mantissa and 2 bits in the exponent for the states, and 1 bit mantissa and 4 bit exponent for the gradients, while the weights are 16 bit fixed...
A New Backpropagation Algorithm without Gradient Descent
The backpropagation algorithm, originally introduced in the 1970s, is the workhorse of learning in neural networks. It makes use of gradient descent, a first-order iterative optimization algorithm for finding the minimum of a function. To find a local minimum of a function using gradient descent, ...
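For reference, the update rule gradient descent iterates is simply x ← x − η ∇f(x); a minimal sketch on f(x) = (x − 3)²:

```python
def grad_f(x):
    return 2 * (x - 3)            # derivative of f(x) = (x - 3)^2

x, lr = 0.0, 0.1                  # arbitrary start and learning rate
for _ in range(100):
    x -= lr * grad_f(x)           # step against the gradient
print(x)                          # approaches the minimizer x = 3
```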
Journal
Journal title: IEEE Transactions on Neural Networks
Year: 1997
ISSN: 1045-9227, 1941-0093
DOI: 10.1109/72.641455